ReLU-like activation
To Reviewer 1: 1. The method is simplistic and places too many constraints on the activation function (only ReLU-like activations)
We believe the proposed H-regularization is novel and by no means simplistic; it is well suited to one-class learning. ReLU-like activations are widely used, e.g., in Transformers and ResNets, so this requirement does not limit the applicability of our method. In our experiments, we followed the baselines and used the same datasets as they did.
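To make the constraint concrete, below is a minimal NumPy sketch of activations commonly described as ReLU-like (ReLU as used in ResNet, Leaky ReLU, and GELU as used in Transformers). The exact family admitted by H-regularization is defined in the paper, so treat these function names and forms as illustrative assumptions, not the paper's definition.

```python
# Minimal sketch of common "ReLU-like" activations (illustrative only;
# the precise family required by H-regularization is defined in the paper).
import numpy as np

def relu(x):
    # Standard ReLU, used throughout ResNet.
    return np.maximum(0.0, x)

def leaky_relu(x, slope=0.01):
    # Leaky ReLU: a ReLU variant with a small slope on the negative side.
    return np.where(x >= 0, x, slope * x)

def gelu(x):
    # GELU (tanh approximation), the activation used in many Transformers.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.linspace(-3.0, 3.0, 7)
print(relu(x))
print(leaky_relu(x))
print(gelu(x))
```

All three share ReLU's one-sided, non-saturating shape, which is why architectures such as ResNet and the Transformer fall within the scope of the method.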